Software integration testing based on communication coverage criteria and partial model generation
This paper considers the problem of integration testing the components of a timed distributed software system. We assume that communication between the components is specified using timed interface automata and use computational tree logic (CTL) to define communication-based coverage criteria that refer to send- and receive-statements and communication paths. The proposed method enables testers to focus, during component integration, on those parts of the specification (e.g. behaviour specifications or Markovian usage models) that are involved in the communication between the components to be integrated. A more specific application area of this approach is the integration of test-models, e.g. a transmission gear can be tested based on separate models for the driver behaviour, the engine condition, and the mechanical and hydraulic transmission states. Given such a state-based specification of a distributed system and a concrete coverage goal, a model checker is used in order to determine the coverage or generate test sequences that achieve the goal. From the generated test sequences we derive a partial test-model of the components involved. The partial model can be used to drive further testing and can also be used as the basis for producing additional partial models in incremental integration testing. While the process of deriving the test sequences could suffer from a combinatorial explosion, the effort required to generate the partial model is polynomial in the number of test sequences and their length. Thus, where it is not feasible to produce test sequences that achieve a given type of coverage, it is still possible to produce a partial model on the basis of test sequences generated to achieve some other criterion. As a result, the process of generating a partial model has the potential to scale to large industrial software systems.
While a particular model checker, UPPAAL, was used, it should be relatively straightforward to adapt the approach for use with other CTL-based model checkers. A potential additional benefit of the approach is that it provides a visual description of the state-based testing of distributed systems, which may be beneficial in other contexts such as education and comprehension.
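The polynomial-time partial-model step can be illustrated by merging test sequences into a prefix-tree transition system; the code below is a sketch under that reading, with event names and the dict-based representation being illustrative assumptions, not the paper's UPPAAL-based tooling.

```python
# Sketch: merge generated test sequences into a partial model
# (a prefix-tree transition system). Event names are illustrative.

def partial_model(test_sequences):
    """Merge test sequences into states and transitions.
    Runs in time polynomial in the number and length of the sequences."""
    states = {(): 0}        # map event-prefix -> state id; () is the initial state
    transitions = set()     # (source state, event, target state)
    for seq in test_sequences:
        prefix = ()
        for event in seq:
            nxt = prefix + (event,)
            if nxt not in states:
                states[nxt] = len(states)
            transitions.add((states[prefix], event, states[nxt]))
            prefix = nxt
    return states, transitions

# Two sequences sharing a common prefix merge into one branching model.
seqs = [["send_a", "recv_a", "send_b"], ["send_a", "recv_a", "recv_b"]]
states, trans = partial_model(seqs)
print(len(states), len(trans))  # 5 states, 4 transitions
```

Shared prefixes are stored once, which is why the construction stays polynomial even when exhaustive test generation would not be feasible.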
Interpreting random forest models using a feature contribution method
Model interpretation is one of the key aspects of the model evaluation process. Explaining the relationship between model variables and outputs is easy for statistical models, such as linear regressions, thanks to the availability of model parameters and their statistical significance. For “black box” models, such as random forests, this information is hidden inside the model structure. This work presents an approach for computing feature contributions for random forest classification models. It allows the influence of each variable on the model prediction to be determined for an individual instance. Interpretation of feature contributions for two UCI benchmark datasets shows the potential of the proposed methodology. The robustness of the results is demonstrated through an extensive analysis of feature contributions calculated for a large number of generated random forest models.
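As a rough illustration of the idea, per-instance contributions for a single decision tree can be computed by walking the prediction path and attributing each change in the node's class fraction to the feature split on at that node; the hand-built tree and values below are illustrative assumptions, not the paper's implementation or datasets. For a forest, such contributions would be averaged over all trees.

```python
# Sketch: path-based feature contributions for one decision tree.
# The tree, thresholds and class-1 fractions below are made up for illustration.

def feature_contributions(tree, x):
    """Walk the tree for instance x; credit each change in the node's
    class-1 fraction to the feature used at that split."""
    contributions = {}
    node = tree
    bias = prev = node["value"]          # class-1 fraction at the root
    while "feature" in node:             # descend until a leaf is reached
        f = node["feature"]
        node = node["left"] if x[f] <= node["threshold"] else node["right"]
        contributions[f] = contributions.get(f, 0.0) + node["value"] - prev
        prev = node["value"]
    return bias, contributions           # bias + sum(contributions) == leaf value

tree = {
    "feature": 0, "threshold": 0.5, "value": 0.5,
    "left":  {"value": 0.2},
    "right": {"feature": 1, "threshold": 1.0, "value": 0.8,
              "left":  {"value": 0.6},
              "right": {"value": 0.9}},
}

bias, contrib = feature_contributions(tree, {0: 0.7, 1: 2.0})
print(bias, contrib)
```

The decomposition is exact by construction: the bias plus the per-feature contributions reproduces the leaf prediction for that instance.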
Effect of lamotrigine on cognition in children with epilepsy
Background: Lamotrigine does not affect cognition in healthy adult volunteers or adult patients with epilepsy, but its effect on cognition in children is uncertain.
Objective: To compare the effect of lamotrigine and placebo on cognition in children with well-controlled or mild epilepsy.
Method: In a double-blind, placebo-controlled, crossover study, 61 children with well-controlled or mild epilepsy were randomly assigned to add-on therapy with either lamotrigine followed by placebo or placebo followed by lamotrigine. Each treatment phase lasted 9 weeks; the crossover period lasted 5 weeks. A neuropsychological test battery was performed during EEG monitoring at baseline and at the end of the placebo and drug phases. The paired Student's t test (two-tailed) was used for statistical analysis of the neuropsychological data, with a p value of 0.01 considered significant. Carryover and period effects were analyzed with generalized linear modeling (SPSS 10).
Results: Forty-eight children completed the study. Seizure frequency was similar during both treatment phases. No significant differences were found in continuous performance, binary choice reaction time, verbal and nonverbal recognition, the computerized visual searching task, verbal and spatial delayed recognition, or verbal and nonverbal working memory between the placebo and lamotrigine treatment phases. There were no significant carryover or period effects when corrected for randomization.
Conclusion: Lamotrigine exhibits no clinically significant cognitive effects as adjunctive therapy in children with epilepsy.
Photon-number-resolution with sub-30-ps timing using multi-element superconducting nanowire single photon detectors
A photon-number-resolving detector based on a four-element superconducting nanowire single photon detector is demonstrated to have sub-30-ps resolution in measuring the arrival time of individual photons. This detector can be used to characterize the photon statistics of non-pulsed light sources and to mitigate dead-time effects in high-speed photon counting applications. Furthermore, a 25% system detection efficiency at 1550 nm was demonstrated, making the detector useful for both low-flux source characterization and high-speed photon-counting and quantum communication applications. The design, fabrication and testing of this detector are described, and a comparison between the measured and theoretical performance is presented.
Air Fraction Correction Optimisation in PET Imaging of Lung Disease
Accurate quantification of radiopharmaceutical uptake from lung PET/CT is challenging due to large variations in the fractions of tissue, air, blood and water. Air fraction correction (AFC) uses voxel-wise air fractions, which can be determined from the CT acquired for attenuation correction (AC). However, resolution effects can cause artefacts in either of these corrections. In this work, we hypothesise that the resolution of the CT image used for AC should match the intrinsic resolution of the PET scanner, but should approximate the reconstructed PET image resolution for AFC. Simulations and reconstructions were performed with the Synergistic Image Reconstruction Framework (SIRF) using phantoms with inhomogeneous attenuation (mu) maps, mimicking the densities observed in lung pathologies. Poisson noise was added to the projection data prior to OSEM reconstruction. AC was performed with a smoothed mu-map; the full-width-half-maximum (FWHM) of the 3D Gaussian kernel was varied (0-10 mm). Post-filters were applied to the reconstructed AC images (FWHM: 0-8 mm). The simulated mu-map was independently convolved with another set of 3D Gaussian kernels, of varying FWHM (0-12 mm), for AFC. The coefficient of variation (CV) in the lung region, designed to be homogeneous post-AFC with optimised kernels, and the mean AFC-standardized uptake value (AFC-SUV) in the regions of simulated pathologies were determined. The spatial resolution of each post-filtered image was determined via a point-source insertion-and-subtraction method on noiseless data. Results showed that the CV was minimised when the kernel applied to the mu-map for AC matched that of the simulated PET scanner and the kernel applied to the mu-map for AFC matched the spatial resolution of the reconstructed PET image. This was observed for all post-reconstruction filters and supports the hypothesis. Initial results from Monte Carlo simulations validate these findings.
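The voxel-wise correction itself can be sketched as follows: a simplified reading in which the air fraction is estimated from CT Hounsfield units (air at -1000 HU, water at 0 HU) and uptake is divided by the remaining tissue fraction. The arrays and HU mapping are illustrative assumptions, not the paper's SIRF pipeline.

```python
import numpy as np

# Sketch of voxel-wise air fraction correction (AFC); HU values and the
# linear air-fraction model are illustrative, not the paper's pipeline.

def air_fraction(mu_map_hu):
    """Air fraction from CT Hounsfield units: -1000 HU = air, 0 HU = water."""
    return np.clip(-mu_map_hu / 1000.0, 0.0, 1.0)

def afc(pet, mu_map_hu):
    """Divide PET uptake by the tissue fraction (1 - air fraction)."""
    tissue = 1.0 - air_fraction(mu_map_hu)
    return np.where(tissue > 0, pet / np.maximum(tissue, 1e-6), 0.0)

pet = np.array([0.3, 0.6, 1.0])        # measured uptake per voxel
mu  = np.array([-700.0, -400.0, 0.0])  # lung-like to water-like voxels
print(afc(pet, mu))                    # uptake per unit tissue volume
```

In this toy example all three voxels contain the same amount of tracer per unit tissue, so the corrected values coincide even though the raw uptake varies with air content; smoothing the mu-map before this step is what the paper optimises.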
Practical Evaluation of Lempel-Ziv-78 and Lempel-Ziv-Welch Tries
We present the first thorough practical study of Lempel-Ziv-78 and Lempel-Ziv-Welch computation based on trie data structures. With a careful selection of trie representations we can beat well-tuned popular trie data structures like Judy, m-Bonsai or Cedar.
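For concreteness, the LZ78 factorisation that such tries support can be sketched with a plain dict-of-dicts trie; this naive structure is for exposition only, not one of the tuned representations (Judy, m-Bonsai, Cedar) evaluated in the study.

```python
# Sketch: LZ78 factorisation with a dict-of-dicts trie.
# Each factor is (index of the referred earlier factor, extension character),
# with index 0 denoting the empty factor.

def lz78(text):
    trie = {}            # maps char -> (factor index, child trie)
    factors = []
    node, ref = trie, 0  # current trie node and the factor it represents
    for ch in text:
        if ch in node:                      # extend the current match
            ref, node = node[ch]
        else:                               # new factor: emit and grow the trie
            factors.append((ref, ch))
            node[ch] = (len(factors), {})
            node, ref = trie, 0
    if ref:                                 # trailing factor equals an earlier one
        factors.append((ref, ""))
    return factors

print(lz78("abababab"))  # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (2, '')]
```

Each input character costs one trie lookup plus at most one node insertion, so the choice of trie representation dominates the running time, which is what the study benchmarks.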
The Utility of a High-intensity Exercise Protocol to Prospectively Assess ACL Injury Risk.
This study investigated the utility of a 5-min high-intensity exercise protocol (SAFT(5)) for inclusion in prospective cohort studies investigating ACL injury risk. Fifteen active females were tested on 2 occasions, during which their non-dominant leg was analysed before SAFT(5) (PRE), immediately after (POST0), 15 min after (POST15), and 30 min after (POST30). On the first occasion, testing included 5 maximum isokinetic contractions for eccentric and concentric hamstring and concentric quadriceps; on the second occasion, 3 trials of 2 landing tasks (i.e., single-leg hop and drop vertical jump) were conducted. Results showed a reduced eccentric hamstring peak torque at POST0, POST15 and POST30 (p<0.05) and a reduced functional HQ ratio (Hecc/Qcon) at POST15 and POST30 (p<0.05). Additionally, a more extended knee angle at POST30 (p<0.05) and an increased knee internal rotation angle at POST0 and POST15 (p<0.05) were found in the single-leg hop. SAFT(5) altered landing strategies in ways associated with increased ACL injury risk and similar to observations from match simulations. Our findings therefore support the utility of a high-intensity exercise protocol such as SAFT(5) to strengthen injury screening tests and for inclusion in prospective cohort studies where time constraints apply.
Towards Constructing Fully Homomorphic Encryption without Ciphertext Noise from Group Theory
In CRYPTO 2008, one year before Gentry's pioneering "bootstrapping" technique for constructing the first fully homomorphic encryption (FHE) scheme, Ostrovsky and Skeith III had suggested a completely different approach towards achieving FHE. Namely, they showed that the NAND operator can be realized in some non-commutative groups; consequently, in combination with the NAND operator realized in such a group, homomorphically encrypting the elements of the group will yield an FHE scheme. However, no observations on how to homomorphically encrypt the group elements were presented in their paper, and there have been no follow-up studies in the literature based on their approach.
The aim of this paper is to exhibit more clearly what is sufficient and what seems to be effective for constructing FHE schemes based on their approach. First, we prove that it is sufficient to find a surjective homomorphism π : G̃ → G between finite groups for which bit operators are realized in G and the elements of the kernel of π are indistinguishable from the general elements of G̃. Secondly, we propose new methodologies to realize bit operators in some groups, which enlarges the choice of groups usable in our framework. Thirdly, we give an observation that a naive approach using matrix groups would never yield secure FHE, due to an attack utilizing the "linearity" of the construction. We then propose an idea to avoid such "linearity" by using combinatorial group theory, and give a prototypical but still incomplete construction, in the sense that it is "non-compact" FHE, i.e., the ciphertext size is unbounded (though the ciphertexts are noise-free, as opposed to the existing FHE schemes). Completely realizing FHE schemes based on our proposed framework is left as a future research topic.
A Generalization of the Goldberg-Sachs Theorem and its Consequences
The Goldberg-Sachs theorem is generalized to all four-dimensional manifolds endowed with a torsion-free connection compatible with the metric; the treatment includes all signatures as well as complex manifolds. It is shown that when the Weyl tensor is algebraically special, severe geometric restrictions are imposed. In particular, it is demonstrated that the simple self-dual eigenbivectors of the Weyl tensor generate integrable isotropic planes. Another result obtained here is that if the self-dual part of the Weyl tensor vanishes in a Ricci-flat manifold of (2,2) signature, the manifold must be Calabi-Yau or symplectic and admits a solution of the source-free Einstein-Maxwell equations.
Quasiperiodicity and non-computability in tilings
We study tilings of the plane that combine strong properties of different natures: combinatorial and algorithmic. We prove the existence of a tile set that accepts only quasiperiodic and non-recursive tilings. Our construction is based on the fixed-point construction; we improve this general technique and make it enforce the property of local regularity of tilings needed for quasiperiodicity. We also prove a stronger result: any effectively closed set can be recursively transformed into a tile set such that the Turing degrees of the resulting tilings consist exactly of the upper cone based on the Turing degrees of the latter.